Re: error status 139 - Mailing list pgsql-admin

From Steven Lane
Subject Re: error status 139
Msg-id v03007802b781d595cfa4@[65.15.153.184]
In response to Re: error status 139  (Tom Lane <tgl@sss.pgh.pa.us>)
List pgsql-admin
Hello all:

I'm trying to load about 10M rows of data into a simple postgres table. The
data is straightforward and fairly clean, but does have glitches every few
tens of thousands of rows. My problem is that when COPY hits a bad row it
just aborts, leaving me to go back, delete or clean up the row and try
again.

Is there any way to import the data that simply skips the bad rows and
tells me which ones they are? Loading this much data is pretty
time-consuming, especially when I have to repeat the whole load just to
find each new bad row. Is there a better way?
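One common workaround (not from this thread, just a sketch) is to pre-filter the dump before handing it to COPY: scan the file once, write well-formed rows to a clean file, and log the malformed ones with their line numbers for later repair. The sketch below assumes tab-delimited COPY text format and a known column count; `EXPECTED_COLS` is a placeholder you would set to your table's width.

```python
EXPECTED_COLS = 3  # assumption: set this to your table's column count

def split_rows(lines, expected_cols=EXPECTED_COLS):
    """Separate well-formed rows from glitched ones.

    A row is "good" if it has the expected number of tab-separated
    fields; anything else is rejected along with its 1-based line
    number so it can be cleaned up by hand.
    """
    good, bad = [], []
    for lineno, line in enumerate(lines, start=1):
        fields = line.rstrip("\n").split("\t")
        if len(fields) == expected_cols:
            good.append(line)
        else:
            bad.append((lineno, line))
    return good, bad

if __name__ == "__main__":
    import sys
    # Usage sketch: python filter.py < dump.txt > clean.txt 2> rejects.txt
    good, bad = split_rows(sys.stdin)
    sys.stdout.writelines(good)
    for lineno, line in bad:
        sys.stderr.write("line %d: %s" % (lineno, line))
```

The clean file can then be loaded with a single COPY, and the reject log tells you exactly which source lines need fixing, so you never have to restart the load to find the next bad row.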

-- sgl


